Orthogonal-least-squares regression: A unified approach for data modelling
Authors
Abstract
A unified approach is proposed for data modelling that includes supervised regression and classification applications as well as unsupervised probability density function estimation. The orthogonal-least-squares regression based on the leave-one-out test criteria is formulated within this unified data-modelling framework to construct sparse kernel models that generalise well. Examples from regression, classification and density estimation applications are used to illustrate the effectiveness of this generic data-modelling approach for constructing parsimonious kernel models with excellent generalisation capability. © 2008 Elsevier B.V. All rights reserved.
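As a rough illustration of the kind of procedure the abstract describes, the sketch below greedily selects Gaussian kernel regressors, scoring each candidate by the exact leave-one-out mean-square error of the resulting least-squares fit. It is a brute-force stand-in, not the paper's efficient orthogonal-decomposition recursion; the kernel choice, the width, the stopping rule and all names (gaussian_kernel, loo_mse, forward_select_loo) are assumptions made for this example.

```python
import numpy as np

def gaussian_kernel(X, centres, width=1.0):
    """Candidate regressor matrix: one Gaussian kernel column per centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_mse(Phi_S, y):
    """Exact leave-one-out MSE of the least-squares fit y ~ Phi_S w, via the hat matrix."""
    G = Phi_S.T @ Phi_S + 1e-10 * np.eye(Phi_S.shape[1])   # tiny jitter for stability
    H = Phi_S @ np.linalg.solve(G, Phi_S.T)                # hat matrix
    resid = y - H @ y
    loo = resid / (1.0 - np.clip(np.diag(H), None, 1.0 - 1e-12))
    return float(np.mean(loo ** 2))

def forward_select_loo(Phi, y, max_terms=20):
    """Greedily add the kernel column that lowers the LOO MSE; stop when none does."""
    selected, best = [], np.inf
    for _ in range(max_terms):
        scores = [(loo_mse(Phi[:, selected + [j]], y), j)
                  for j in range(Phi.shape[1]) if j not in selected]
        score, j = min(scores)
        if score >= best:                 # LOO criterion: no further improvement
            break
        selected.append(j)
        best = score
    w = np.linalg.lstsq(Phi[:, selected], y, rcond=None)[0]
    return selected, w, best

# Toy regression example: every training input is a candidate kernel centre.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
Phi = gaussian_kernel(X, X)
idx, w, loo = forward_select_loo(Phi, y)
print(f"selected {len(idx)} of {Phi.shape[1]} kernels, LOO MSE = {loo:.4f}")
```

The stopping rule, which halts as soon as the leave-one-out error stops improving, is what keeps the selected model sparse rather than simply fitting the training data.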
Similar resources
Orthogonal-Least-Squares Forward Selection for Parsimonious Modelling from Data
The objective of modelling from data is not that the model simply fits the training data well. Rather, the goodness of a model is characterized by its generalization capability, interpretability and ease for knowledge extraction. All these desired properties depend crucially on the ability to construct appropriate parsimonious models by the modelling process, and a basic principle in practical ...
Sparse Kernel Modelling: A Unified Approach
A unified approach is proposed for sparse kernel data modelling that includes regression and classification as well as probability density function estimation. The orthogonal-least-squares forward selection method based on the leave-one-out test criteria is presented within this unified data-modelling framework to construct sparse kernel models that generalise well. Examples from regression, cl...
Multi-output regression using a locally regularised orthogonal least-squares algorithm - Vision, Image and Signal Processing, IEE Proceedings
The paper considers data modelling using multi-output regression models. A locally regularised orthogonal least-squares (LROLS) algorithm is proposed for constructing sparse multi-output regression models that generalise well. By associating each regressor in the regression model with an individual regularisation parameter, the ability of the multi-output orthogonal least-squares (OLS) model se...
Kernel-based Data Modelling Using Orthogonal Least Squares Selection with Local Regularisation
Combining orthogonal least squares (OLS) model selection with local regularisation or smoothing leads to efficient sparse kernel-based data modelling. By assigning each orthogonal weight in the regression model with an individual regularisation parameter, the ability for the OLS model selection to produce a very parsimonious model with excellent generalisation performance is greatly enhanced.
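The two entries above describe attaching an individual regularisation parameter to each regressor (or each orthogonal weight) and re-estimating it iteratively, which drives most weights towards zero and leaves a sparse model. The sketch below is a simplified stand-in for that idea: it runs standard evidence-style (MacKay-type) re-estimation of one regularisation parameter per kernel column on the raw design matrix, whereas the LROLS algorithms cited above perform the equivalent updates in the orthogonalised regressor space. The function name, the pruning threshold and the initialisation are assumptions of this example, not taken from those papers.

```python
import numpy as np

def locally_regularised_fit(Phi, y, n_iter=50, prune=1e6):
    """One regularisation parameter per regressor, re-estimated iteratively;
    regressors whose parameter diverges are pruned, leaving a sparse model."""
    N, M = Phi.shape
    alpha = np.ones(M)                     # individual regularisation parameters
    beta = 1.0 / np.var(y)                 # noise precision estimate
    keep = np.arange(M)
    for _ in range(n_iter):
        P = Phi[:, keep]
        Sigma = np.linalg.inv(beta * P.T @ P + np.diag(alpha[keep]))
        mu = beta * Sigma @ P.T @ y                      # regularised LS weights
        gamma = 1.0 - alpha[keep] * np.diag(Sigma)       # effective parameter counts
        alpha[keep] = gamma / (mu ** 2 + 1e-12)          # per-regressor re-estimation
        resid = y - P @ mu
        beta = (N - gamma.sum()) / (resid @ resid + 1e-12)
        mask = alpha[keep] < prune                       # drop regressors forced out
        keep, mu = keep[mask], mu[mask]
    return keep, mu
```

Applied to a kernel design matrix such as the Phi built in the earlier sketch, e.g. keep, w = locally_regularised_fit(Phi, y), the indices in keep identify the surviving kernel centres and w their weights.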
Journal title:
- Neurocomputing
Volume 72, Issue -
Pages -
Publication date: 2009